
    Keynote Address

    Transcript of the keynote address at the 2018 Seattle University School of Law symposium “Singularity: AI and the Law.” The keynote address, presented by Ryan Calo, discusses the current state of artificial intelligence learning and how it is moving toward robotic singularity.

    Artificial Intelligence Policy: A Primer and Roadmap

    Talk of artificial intelligence is everywhere. People marvel at the capacity of machines to translate any language and master any game. Others condemn the use of secret algorithms to sentence criminal defendants or recoil at the prospect of machines gunning for blue, pink, and white-collar jobs. Some worry aloud that artificial intelligence will be humankind’s “final invention.” This essay, prepared in connection with UC Davis Law Review’s 50th anniversary symposium, explains why AI is suddenly on everyone’s mind and provides a roadmap to the major policy questions AI raises. The essay is designed to help policymakers, investors, technologists, scholars, and students understand the contemporary policy environment around AI at least well enough to initiate their own exploration.

    Modeling Through

    Theorists of justice have long imagined a decision-maker capable of acting wisely in every circumstance. Policymakers seldom live up to this ideal. They face well-understood limits, including an inability to anticipate the societal impacts of state intervention along a range of dimensions and values. Policymakers cannot see around corners or address societal problems at their roots. When it comes to regulation and policy-setting, policymakers are often forced, in the memorable words of political economist Charles Lindblom, to “muddle through” as best they can. Powerful new affordances, from supercomputing to artificial intelligence, have arisen in the decades since Lindblom’s 1959 article that stand to enhance policymaking. Computer-aided modeling holds promise in delivering on the broader goals of forecasting and systems analysis developed in the 1970s, arming policymakers with the means to anticipate the impacts of state intervention along several lines—to model, instead of muddle. A few policymakers have already dipped a toe into these waters; others are being told that the water is warm. The prospect that economic, physical, and even social forces could be modeled by machines confronts policymakers with a paradox. Society may expect policymakers to avail themselves of techniques already usefully deployed in other sectors, especially where statutes or executive orders require the agency to anticipate the impact of new rules on particular values. At the same time, “modeling through” holds novel perils that policymakers may be ill-equipped to address. Concerns include privacy, brittleness, and automation bias, of which law and technology scholars are keenly aware. They also include the extension and deepening of the quantifying turn in governance, a process that obscures normative judgments and recognizes only that which the machines can see. The water may be warm, but there are sharks in it. These tensions are not new. And there is danger in hewing to the status quo. (We should still pursue renewable energy even though wind turbines as presently configured waste energy and kill wildlife.) As modeling through gains traction, however, policymakers, constituents, and academic critics must remain vigilant. This being early days, American society is uniquely positioned to shape the transition from muddling to modeling.

    Can Americans Resist Surveillance?

    This Essay analyzes the ability of everyday Americans to resist and alter the conditions of government surveillance. Americans appear to have several avenues of resistance or reform. We can vote for privacy-friendly politicians, challenge surveillance in court, adopt encryption or other technologies, and put market pressure on companies not to cooperate with law enforcement. In practice, however, many of these avenues are limited. Reform-minded officials lack the capacity for real oversight. Litigants lack standing to invoke the Constitution in court. Encryption is not usable and can turn citizens into targets. Citizens can extract promises from companies to push back against government surveillance on their behalf but have no recourse if these promises are not enforced. By way of method, this Essay adopts Professor James Gibson’s influential theory of affordances. Originating in psychology, and famous everywhere but in law, affordance theory has evolved into a general method of inquiry with its own useful vocabulary and commitments. This Essay leverages these concepts to lend structure to an otherwise-haphazard inquiry into the capabilities of citizens to perceive and affect surveillance. This Essay contributes to affordance theory by insisting that law itself represents an important affordance.

    Privacy Harm Exceptionalism

    “Exceptionalism” refers to the belief that a person, place, or thing is qualitatively different from others in the same basic category. Thus, some have spoken of America’s exceptionalism as a nation. Early debates about the Internet focused on the prospect that existing laws and institutions would prove inadequate to govern the new medium of cyberspace. Scholars have made similar claims about other areas of law. The focus of this short essay is the supposed exceptionalism of privacy. Rather than catalogue all the ways that privacy might differ from other concepts or areas of study, I intend to focus on the narrow but important issue of harm. I will argue that courts and some scholars require a showing of harm in privacy out of proportion with other areas of law. Many also assume, counterintuitively, that the information industry somehow differs from virtually every other industry in generating no real externalities.

    Robotics and the Lessons of Cyberlaw

    Two decades of analysis have produced a rich set of insights as to how the law should apply to the Internet’s peculiar characteristics. But, in the meantime, technology has not stood still. The same public and private institutions that developed the Internet, from the armed forces to search engines, have initiated a significant shift toward developing robotics and artificial intelligence. This Article is the first to examine what the introduction of a new, equally transformative technology means for cyberlaw and policy. Robotics has a different set of essential qualities than the Internet and accordingly will raise distinct legal issues. Robotics combines, for the first time, the promiscuity of data with the capacity to do physical harm; robotic systems accomplish tasks in ways that cannot be anticipated in advance; and robots increasingly blur the line between person and instrument. Robotics will prove “exceptional” in the sense of occasioning systematic changes to law, institutions, and the legal academy. But we will not be writing on a clean slate: many of the core insights and methods of cyberlaw will prove crucial in integrating robotics, and perhaps whatever technology follows.

    Privacy, Vulnerability, and Affordance

    This essay begins to unpack the complex, sometimes contradictory relationship between privacy and vulnerability. I begin by exploring how the law conceives of vulnerability — essentially, as a binary status meriting special consideration where present. Recent literature recognizes vulnerability not as a status but as a state — a dynamic and manipulable condition that everyone experiences to different degrees and at different times. I then discuss various ways in which vulnerability and privacy intersect. I introduce an analytic distinction between vulnerability rendering, i.e., making a person more vulnerable, and the exploitation of vulnerability, whether manufactured or native. I also describe the relationship between privacy and vulnerability as a vicious or virtuous circle. The more vulnerable a person is — due to poverty, for instance — the less privacy they tend to enjoy; meanwhile, a lack of privacy opens the door to greater vulnerability and exploitation. Privacy can protect against vulnerability, but it can also be invoked to engender it. I next describe how privacy supports the creation and exploitation of vulnerability in ways literal, rhetorical, and conceptual. An abuser may literally use privacy to hide his abuse from law enforcement. A legislature or group may invoke privacy rhetorically to justify discrimination, for instance, against transgender individuals who wish to use the bathroom consistent with their gender identity. And courts obscure vulnerability conceptually when they decide a case on the basis of privacy instead of the value that is more centrally at stake. Finally, building on previous work, I offer James Gibson’s theory of affordances as a theoretical lens by which to analyze the complex relationship that privacy mediates. Privacy understood as an affordance permits a more nuanced understanding of privacy and vulnerability and could perhaps lead to wiser privacy law and policy.

    Communications Privacy for and by Whom?

    A response to Professor Orin Kerr’s The Next Generation Communications Privacy Act, which makes a series of quiet assumptions that readers may find controversial. First, the Article reads as though ECPA exists only to protect citizens from public officials. According to its text and to case law, however, ECPA also protects private citizens from one another in ways any new act should revisit. Second, the Article assumes that society should address communications privacy with a statute, whereas specific experiences with ECPA suggest that the courts may be better suited to address communications privacy—for reasons Professor Kerr himself offers. Finally, the Article addresses ECPA in isolation from the Foreign Intelligence Surveillance Act of 1978 (FISA), which seems strange in light of revelations that our government systematically intercepts and stores its citizens’ electronic communications under FISA’s auspices. Put another way, The Next Generation Communications Privacy Act succeeds marvelously on its own terms, but not necessarily on everyone else’s. Worse still, we do not benefit from Professor Kerr’s powerful insights regarding the issues he omits.